Learning directed probabilistic logical models from relational data

Author

  • Daan Fierens
Abstract

Data that has a complex relational structure, and in which observations are noisy or partially missing, poses several challenges to traditional machine learning algorithms. One solution to this problem is the use of so-called probabilistic logical models (models that combine elements of first-order logic with probabilities) and corresponding learning algorithms. In this thesis we focus on directed probabilistic logical models. We show how to represent such models and develop several algorithms to learn such models from data.


Similar resources

Learning Directed Probabilistic Logical Models Using Ordering-Search

There is an increasing interest in upgrading Bayesian networks to the relational case, resulting in so-called directed probabilistic logical models. In this paper we discuss how to learn non-recursive directed probabilistic logical models from relational data. This problem has been tackled before by upgrading the structure-search algorithm for learning Bayesian networks. In this paper w...


Generalizing Ordering-search for Learning Directed Probabilistic Logical Models

Recently, there has been an increasing interest in directed probabilistic logical models and a variety of such languages has been proposed. Although many authors provide high-level arguments to show that in principle models in their language can be learned from data, most such learning algorithms have not yet been studied in detail. We propose an algorithm to learn both structure and conditiona...


5 Probabilistic Relational Models

Probabilistic relational models (PRMs) are a rich representation language for structured statistical models. They combine a frame-based logical representation with probabilistic semantics based on directed graphical models (Bayesian networks). This chapter gives an introduction to probabilistic relational models, describing semantics for attribute uncertainty, structural uncertainty, and class ...


On the Relationship between Logical Bayesian Networks and Probabilistic Logic Programming Based on the Distribution Semantics

A significant part of current research on ILP deals with probabilistic logical models. Over the last decade many logics or languages for representing such models have been introduced. There is currently a great need for insight into the relationships between all these languages. One class of languages consists of those that extend probabilistic models with elements of logic, such as in the language of ...




Journal:
  • AI Commun.

Volume 21, Issue

Pages  -

Publication year: 2008